Learning Multiple Classifiers with Dirichlet Process Mixture Priors

Authors

  • Ya Xue
  • Xuejun Liao
  • Lawrence Carin
  • Balaji Krishnapuram
Abstract

A real-world classification task can often be viewed as consisting of multiple subtasks. In remote sensing, for example, one may have multiple sets of radar images, each collected at a particular geographical location, with the aim of designing classifiers for detecting objects of interest in images at all locations. In this situation, one can either learn a single classifier from a simple pooling of images from different locations, or learn multiple classifiers, each for a particular location and based only on images from that location. Unfortunately, neither of the two is optimal, because the first ignores the differences between locations and the second ignores the analogy between them.
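As an illustration of the two baselines contrasted above, the following is a minimal sketch, not the paper's method: the synthetic data, the feature dimension, and the use of scikit-learn's LogisticRegression are all illustrative assumptions.

```python
# Sketch of the two baseline strategies described in the abstract: one pooled
# classifier versus one independent classifier per location.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
num_locations, n_per_loc, dim = 3, 200, 5

# Simulate related but non-identical tasks: each location gets a slightly
# perturbed copy of a shared weight vector (purely illustrative data).
shared_w = rng.normal(size=dim)
tasks = []
for _ in range(num_locations):
    w = shared_w + 0.3 * rng.normal(size=dim)
    X = rng.normal(size=(n_per_loc, dim))
    y = (X @ w + 0.1 * rng.normal(size=n_per_loc) > 0).astype(int)
    tasks.append((X, y))

# Baseline 1: pool all locations into a single classifier
# (ignores the differences between locations).
X_pool = np.vstack([X for X, _ in tasks])
y_pool = np.concatenate([y for _, y in tasks])
pooled = LogisticRegression().fit(X_pool, y_pool)

# Baseline 2: a separate classifier per location
# (ignores the analogy between locations).
per_location = [LogisticRegression().fit(X, y) for X, y in tasks]
```

A Dirichlet process mixture prior over the per-task classifier parameters sits between these two extremes, letting tasks share classifiers where the data support it.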


Similar Articles

Classification with Incomplete Data Using Dirichlet Process Priors

A non-parametric hierarchical Bayesian framework is developed for designing a classifier, based on a mixture of simple (linear) classifiers. Each simple classifier is termed a local "expert", and the number of experts and their construction are manifested via a Dirichlet process formulation. The simple form of the "experts" allows analytical handling of incomplete data. The model is extended to...


Instance Label Prediction by Dirichlet Process Multiple Instance Learning

We propose a generative Bayesian model that predicts instance labels from weak (bag-level) supervision. We solve this problem by simultaneously modeling class distributions by Gaussian mixture models and inferring the class labels of positive bag instances that satisfy the multiple instance constraints. We employ Dirichlet process priors on mixture weights to automate model selection, and effic...


Efficient Bayesian Task-Level Transfer Learning

In this paper, we show how using the Dirichlet Process mixture model as a generative model of data sets provides a simple and effective method for transfer learning. In particular, we present a hierarchical extension of the classic Naive Bayes classifier that couples multiple Naive Bayes classifiers by placing a Dirichlet Process prior over their parameters and show how recent advances in appro...


Multi-Task Classification for Incomplete Data

A non-parametric hierarchical Bayesian framework is developed for designing a sophisticated classifier based on a mixture of simple (linear) classifiers. Each simple classifier is termed a local “expert”, and the number of experts and their construction are manifested via a Dirichlet process formulation. The simple form of the “experts” allows direct handling of incomplete data. The model is fu...


Multi-Task Learning for Classification with Dirichlet Process Priors

Multi-task learning (MTL) is considered for logistic-regression classifiers, based on a Dirichlet process (DP) formulation. A symmetric MTL (SMTL) formulation is considered in which classifiers for multiple tasks are learned jointly, with a variational Bayesian (VB) solution. We also consider an asymmetric MTL (AMTL) formulation in which the posterior density function from the SMTL model parame...
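To give a concrete sense of how a DP prior couples classifiers across tasks, here is a minimal sketch of the Chinese restaurant process view of that prior. It shows only the prior's clustering behavior, not the variational Bayesian inference described in the paper; the function name, the value of alpha, and the number of tasks are illustrative assumptions.

```python
import numpy as np

def sample_crp_partition(num_tasks, alpha, rng):
    """Draw one task-to-cluster assignment from a Chinese restaurant process.

    Tasks assigned to the same cluster would share a single classifier
    (e.g. one logistic-regression weight vector), which is how a DP prior
    couples the tasks without fixing the number of shared classifiers.
    """
    assignments = [0]  # the first task starts its own cluster
    for _ in range(1, num_tasks):
        counts = np.bincount(assignments).astype(float)
        probs = np.append(counts, alpha)   # existing clusters vs. a new one
        probs /= probs.sum()
        assignments.append(int(rng.choice(len(probs), p=probs)))
    return assignments

rng = np.random.default_rng(0)
print(sample_crp_partition(num_tasks=10, alpha=1.0, rng=rng))
```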



Journal title:

Volume   Issue

Pages   -

Publication date: 2005